
    The sensitivity of landscape evolution models to spatial and temporal rainfall resolution

    Climate is one of the main drivers for landscape evolution models (LEMs), yet its representation is often basic, with values averaged over long time periods and frequently lumped to a single value for the whole basin. Clearly, this hides the heterogeneity of precipitation - but what impact does this averaging have on erosion and deposition, topography, and the final shape of LEM landscapes? This paper presents results from the first systematic investigation into how the spatial and temporal resolution of precipitation affects LEM simulations of sediment yields and patterns of erosion and deposition. This is carried out by assessing the sensitivity of the CAESAR-Lisflood LEM to different spatial and temporal precipitation resolutions, as well as how this interacts with different-sized drainage basins over short and long timescales. A range of simulations were carried out, varying rainfall from 0.25 h × 5 km to 24 h × lumped resolution over three different-sized basins for 30-year durations. Results showed that there was a sensitivity to temporal and spatial resolution, with the finest leading to >100 % increases in basin sediment yields. To look at how these interactions manifested over longer timescales, several simulations were carried out to model a 1000-year period. These showed a systematic bias towards greater erosion in uplands and deposition in valley floors with the finest spatial- and temporal-resolution data. Further tests showed that this effect was due solely to the data resolution, not orographic factors. Additional research indicated that these differences in sediment yield could be accounted for by adding a compensation factor to the model sediment transport law. However, this resulted in notable differences in the topographies generated, especially in third-order and higher streams. The implications of these findings are that uncalibrated past and present LEMs using lumped and time-averaged climate inputs may be under-predicting basin sediment yields, as well as introducing spatial biases through under-predicting erosion in first-order streams but over-predicting erosion in second- and third-order streams and valley floor areas. Calibrated LEMs may give correct sediment yields, but patterns of erosion and deposition will be different, and the calibration may not be correct for changing climates. This may have significant impacts on the modelled basin profile and shape from long-timescale simulations.
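
    The under-prediction described here follows from the nonlinearity of sediment transport laws: if flux scales as a power m > 1 of runoff, then averaging rainfall in time or space reduces the computed flux (Jensen's inequality). A minimal Python sketch of the effect, using a generic power-law transport relation and entirely hypothetical parameter values rather than CAESAR-Lisflood's actual formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical bursty rainfall record: 0.25 h intensities (mm/h) over 30 days.
    fine = rng.gamma(shape=0.2, scale=10.0, size=30 * 24 * 4)

    # "Lumped" analogue: the same total rain spread evenly over the period.
    lumped = np.full_like(fine, fine.mean())

    def sediment_yield(intensity, k=1e-3, m=1.7):
        """Toy power-law transport: flux = k * q**m, nonlinear for m > 1.

        Treats rainfall intensity as a proxy for runoff q; a real LEM
        routes flow through the basin first.
        """
        return k * np.sum(intensity ** m)

    ratio = sediment_yield(fine) / sediment_yield(lumped)
    print(f"fine / lumped sediment yield ratio: {ratio:.1f}")
    # Any m > 1 gives a ratio > 1: averaged rainfall under-predicts yield.
    ```

    The same reasoning applies spatially: lumping rainfall to one basin-wide value smooths out the localised convective cells that drive the largest transport events.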

    X-ray interferometry with transmissive beam combiners for ultra-high angular resolution astronomy

    Interferometry provides one of the possible routes to ultra-high angular resolution for X-ray and gamma-ray astronomy. Sub-micro-arc-second angular resolution, necessary to achieve objectives such as imaging the regions around the event horizon of a super-massive black hole at the center of an active galaxy, can be achieved if beams from parts of the incoming wavefront separated by 100s of meters can be stably and accurately brought together at small angles. One way of achieving this is by using grazing incidence mirrors. Here we investigate an alternative approach in which the beams are recombined by optical elements working in transmission. It is shown that the use of diffractive elements is a particularly attractive option. We report experimental results from a simple 2-beam interferometer using a low-cost, commercially available profiled film as the diffractive elements. A rotationally symmetric filled (or mostly filled) aperture variant of such an interferometer, equivalent to an X-ray axicon, is shown to offer a much wider bandpass than either a Phase Fresnel Lens (PFL) or a PFL with a refractive lens in an achromatic pair. Simulations of an example system are presented. Comment: To be published in "Experimental Astronomy".
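
    The angular-resolution claim can be checked with the standard two-beam relations: a baseline d resolves angles of order λ/d, and beams crossing at a small angle θ produce fringes of period ≈ λ/θ. A short sketch with illustrative numbers (the photon energy, baseline, and combining angle below are assumptions, not values from the experiment reported here):

    ```python
    import math

    E_keV = 6.0                     # hypothetical photon energy
    wavelength = 1.2398e-9 / E_keV  # metres (lambda[nm] = 1.2398 / E[keV])

    baseline = 100.0                # metres between sampled wavefront sections
    resolution_rad = wavelength / baseline
    resolution_uas = math.degrees(resolution_rad) * 3600e6  # micro-arc-seconds
    print(f"diffraction-limited resolution ~ {resolution_uas:.3f} micro-arcsec")

    # Fringe period when the two beams are brought together at angle theta:
    theta = 1e-5                    # radians, set by the combiner geometry
    fringe_period = wavelength / theta
    print(f"fringe period at detector ~ {fringe_period * 1e6:.1f} micrometres")
    ```

    With these numbers the resolution comes out below one micro-arc-second, while the fringes are tens of micrometres across, i.e. resolvable by ordinary X-ray detectors; this is why small recombination angles are the key requirement.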

    Non-response biases in surveys of schoolchildren: the case of the English Programme for International Student Assessment (PISA) samples

    We analyse response patterns to an important survey of schoolchildren, exploiting rich auxiliary information on respondents' and non-respondents' cognitive ability that is correlated both with response and with the learning achievement that the survey aims to measure. The survey is the Programme for International Student Assessment (PISA), which sets response thresholds in an attempt to control the quality of data. We analyse the case of England for 2000, when response rates were deemed sufficiently high by the organizers of the survey to publish the results, and 2003, when response rates were a little lower and deemed of sufficient concern for the results not to be published. We construct weights that account for the pattern of non-response by using two methods: propensity scores and the generalized regression estimator. There is clear evidence of biases, but there is no indication that the slightly higher response rates in 2000 were associated with higher-quality data. This underlines the danger of using response rate thresholds as a guide to data quality.
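
    As a concrete illustration of the first of the two weighting methods, a propensity-score sketch in Python: response is modelled from the auxiliary ability measure, and respondents are weighted by the inverse of their estimated response probability. All data here are simulated, and the variable names and logistic specification are illustrative, not the authors' model:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000

    # Auxiliary ability measure, known for respondents AND non-respondents.
    x = rng.normal(size=n)
    # Response is more likely for higher-ability pupils (hypothetical mechanism).
    p_true = 1.0 / (1.0 + np.exp(-(0.5 + 0.8 * x)))
    r = rng.binomial(1, p_true).astype(bool)
    # Achievement, correlated with ability - observed only for respondents.
    y = 500.0 + 40.0 * x + rng.normal(0.0, 20.0, size=n)

    # Fit a response-propensity model on the full sample, then weight each
    # respondent by the inverse of the estimated response probability.
    X = x.reshape(-1, 1)
    p_hat = LogisticRegression().fit(X, r).predict_proba(X)[:, 1]
    w = 1.0 / p_hat[r]

    print(f"unweighted respondent mean: {y[r].mean():.1f}")  # biased upwards
    print(f"propensity-weighted mean:   {np.average(y[r], weights=w):.1f}")
    print(f"full-sample mean (truth):   {y.mean():.1f}")
    ```

    The weighted mean moves back towards the full-sample mean precisely because the auxiliary variable drives both response and achievement, which is the situation the paper exploits.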

    External loading determines specific ECM gene regulation

    Bioartificial matrices embedded with cells are stimulated in bioreactors to facilitate ECM production. As cells attach, they develop forces, which are dependent on cell type and matrix stiffness. External forces (i.e. strain), however, are critical for tissue homeostasis and elicit specific cellular responses, such as gene expression and protein production. Collagen type I is a widely used scaffold in tissue engineering. The aim of this study was to examine the mechanical and molecular responses of different cell types to increasing collagen substrate stiffness.

    The Fantastic Four: A plug 'n' play set of optimal control pulses for enhancing NMR spectroscopy

    We present highly robust, optimal control-based shaped pulses designed to replace all 90° and 180° hard pulses in a given pulse sequence for improved performance. Special attention was devoted to ensuring that the pulses can be simply substituted in a one-to-one fashion for the original hard pulses without any additional modification of the existing sequence. The set of four pulses for each nucleus therefore consists of 90° and 180° point-to-point (PP) and universal rotation (UR) pulses of identical duration. These 1 ms pulses provide uniform performance over resonance offsets of 20 kHz (1H) and 35 kHz (13C) and tolerate reasonably large radio frequency (RF) inhomogeneity/miscalibration of ±15% (1H) and ±10% (13C), making them especially suitable for NMR of small-to-medium-sized molecules (for which relaxation effects during the pulse are negligible) at an accessible and widely utilized spectrometer field strength of 600 MHz. The experimental performance of conventional hard-pulse sequences is shown to be greatly improved by incorporating the new pulses, each set referred to as the Fantastic Four (Fanta4). Comment: 28 pages, 19 figures.
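
    The offset ranges quoted above can be appreciated by simulating how a conventional hard 90° pulse degrades off resonance, which is the problem the shaped pulses address. A spin-1/2 sketch with an assumed 25 kHz RF amplitude (the pulse parameters are illustrative and are not the Fanta4 waveforms):

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Spin-1/2 angular momentum operators (Pauli matrices / 2)
    Ix = np.array([[0, 1], [1, 0]], dtype=complex) / 2
    Iy = np.array([[0, -1j], [1j, 0]]) / 2
    Iz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

    nu1 = 25e3        # assumed RF amplitude (Hz)
    tp = 0.25 / nu1   # hard 90-degree pulse length (10 us)

    print(" offset    transverse signal (1.0 = ideal)")
    for offset in np.linspace(-20e3, 20e3, 9):    # resonance offset (Hz)
        H = 2 * np.pi * (nu1 * Ix + offset * Iz)  # rotating-frame Hamiltonian
        U = expm(-1j * H * tp)                    # pulse propagator
        rho = U @ Iz @ U.conj().T                 # start from z equilibrium
        signal = -2 * np.real(np.trace(rho @ Iy)) # -y component after x pulse
        print(f"{offset / 1e3:+6.1f} kHz   {signal:+.3f}")
    ```

    The signal falls away from 1.0 as the offset grows; a 1 ms optimal-control pulse replaces this single rotation with a shaped amplitude/phase profile whose net propagator stays close to the ideal rotation across the whole offset and RF-miscalibration window.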

    The Efficiency Gains from Dynamic Tax Reform

    This paper presents a new simulation methodology for determining the pure efficiency gains from tax reform along the general equilibrium rational expectations growth path of life cycle economies. The principal findings concern the effects of switching from a proportional income tax with rates similar to those in the U.S. to either a proportional tax on consumption or a proportional tax on labor income. A switch to consumption taxation generates a sustainable welfare gain of almost 2 percent of lifetime resources. In contrast, a transition to wage taxation generates a loss of greater than ? percent of lifetime resources. A second general result is that even a mild degree of progressivity in the income tax system imposes a very large efficiency cost.
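
    The flavour of such life-cycle welfare comparisons can be conveyed by a stripped-down two-period example: a worker earns while young, saves, consumes in both periods, and we compare a proportional income tax with an equal-revenue consumption tax, expressing the difference as an equivalent variation in lifetime resources. This is a toy model with made-up parameters, not the paper's large-scale rational-expectations simulation:

    ```python
    import math
    from scipy.optimize import brentq

    w, r, beta = 1.0, 0.5, 0.9  # wage, interest rate, discount factor (toy values)

    def lifecycle(tau_y=0.0, tau_c=0.0):
        """Two-period log-utility plan; returns (utility, PV of tax revenue)."""
        R_net = 1 + r * (1 - tau_y)     # income tax also hits interest income
        W = w * (1 - tau_y)             # after-tax lifetime earnings
        spend1 = W / (1 + beta)         # tax-inclusive spending, period 1
        c1 = spend1 / (1 + tau_c)
        c2 = beta * R_net * c1          # Euler equation under log utility
        saving = W - spend1
        revenue = (tau_y * w + tau_y * r * saving / (1 + r)
                   + tau_c * (c1 + c2 / (1 + r)))
        return math.log(c1) + beta * math.log(c2), revenue

    U_y, rev_y = lifecycle(tau_y=0.3)
    # Consumption-tax rate raising the same present value of revenue:
    tau_c = brentq(lambda t: lifecycle(tau_c=t)[1] - rev_y, 1e-6, 2.0)
    U_c, _ = lifecycle(tau_c=tau_c)

    # Equivalent variation: scaling consumption by (1+ev) raises log utility
    # by (1+beta)*ln(1+ev); invert to express the gain in lifetime resources.
    ev = math.exp((U_c - U_y) / (1 + beta)) - 1
    print(f"equal-revenue consumption tax: {tau_c:.3f}")
    print(f"welfare gain from the switch:  {100 * ev:.2f}% of lifetime resources")
    ```

    Even this toy version reproduces the qualitative finding: the consumption tax does not distort the intertemporal price of consumption the way a tax on interest income does, so the switch yields a positive equivalent variation.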

    Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts in pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during ERAs. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action.
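
    The three-dimensional typology lends itself to a simple tabulation: each elicited judgement records a level, nature and location, and the analyst aggregates, for example, median levels by assessment phase to see where uncertainty concentrates. A sketch with invented records (the field names and the 0-4 level scale are assumptions, not the authors' elicitation instrument):

    ```python
    from collections import defaultdict
    from dataclasses import dataclass
    from statistics import median

    @dataclass
    class Judgement:
        phase: str     # ERA phase in which the uncertainty arises
        level: int     # severity: 0 = determinism ... 4 = ignorance (assumed scale)
        nature: str    # "epistemic" (reducible) or "aleatory" (inherent variability)
        location: str  # source, e.g. "data", "model structure", "language"

    records = [
        Judgement("problem formulation",   2, "epistemic", "data"),
        Judgement("exposure assessment",   2, "epistemic", "data"),
        Judgement("effects assessment",    3, "aleatory",  "data"),
        Judgement("risk characterisation", 4, "epistemic", "model structure"),
        Judgement("risk characterisation", 3, "epistemic", "language"),
    ]

    levels_by_phase = defaultdict(list)
    for rec in records:
        levels_by_phase[rec.phase].append(rec.level)

    # Rank phases by median uncertainty level to prioritise reduction effort.
    for phase, levels in sorted(levels_by_phase.items(),
                                key=lambda kv: -median(kv[1])):
        print(f"{phase:25s} median level = {median(levels)}")
    ```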